Minimizing the description length using steepest descent
Authors
Abstract
The minimum description length (MDL) principle has recently received much attention for its effectiveness in automatic shape modelling. One drawback of this technique has been the slow convergence of the optimization step. In this paper the Jacobian of the objective function is derived. With the Jacobian available, a variety of optimization techniques can be considered. We apply steepest descent and show that it is more efficient than the previously proposed Nelder-Mead simplex optimization.
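For illustration, below is a minimal Python sketch of a steepest descent loop driven by an analytic gradient, with a simple backtracking step rule. The function names, tolerances, and the toy quadratic objective are illustrative stand-ins, not the paper's MDL cost or implementation.

```python
import numpy as np

def steepest_descent(f, grad_f, x0, step=1e-2, tol=1e-6, max_iter=1000):
    """Minimize f by steepest descent using the analytic gradient grad_f.

    A simple backtracking rule halves the step until the objective decreases,
    so each iteration is non-increasing in f. (Generic sketch, not the paper's code.)
    """
    x = x0.astype(float)
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:      # gradient small enough: stop
            break
        t = step
        # Backtracking: shrink the step while it would increase the objective.
        while f(x - t * g) > f(x) and t > 1e-12:
            t *= 0.5
        x = x - t * g
    return x

# Toy usage on a quadratic objective (a stand-in for the MDL cost).
if __name__ == "__main__":
    A = np.diag([1.0, 10.0])
    f = lambda x: 0.5 * x @ A @ x
    grad_f = lambda x: A @ x
    x_min = steepest_descent(f, grad_f, np.array([3.0, -2.0]))
    print(x_min)  # close to the origin, the minimizer of the quadratic
```

In the setting described above, `grad_f` would be supplied by the derived Jacobian of the objective, which is what makes gradient-based methods such as steepest descent applicable in place of the derivative-free Nelder-Mead simplex search.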
Similar resources
A Free Line Search Steepest Descent Method for Solving Unconstrained Optimization Problems
In this paper, we solve unconstrained optimization problems using a free line search steepest descent method. First, we propose a double-parameter scaled quasi-Newton formula for calculating an approximation of the Hessian matrix. The approximation obtained from this formula is a positive definite matrix that satisfies the standard secant relation. We also show that the largest eigenvalue...
Hybrid steepest-descent method with sequential and functional errors in Banach space
Let $X$ be a reflexive Banach space, $T:X\to X$ be a nonexpansive mapping with $C=\mathrm{Fix}(T)\neq\emptyset$, and $F:X\to X$ be $\delta$-strongly accretive and $\lambda$-strictly pseudocontractive with $\delta+\lambda>1$. In this paper, we present modified hybrid steepest-descent methods, involving sequential errors and functional errors with functions admitting a center, which generate convergent sequences ...
Steepest Descent
The steepest descent method has a rich history and is one of the simplest and best known methods for minimizing a function. While the method is not commonly used in practice due to its slow convergence rate, understanding the convergence properties of this method can lead to a better understanding of many of the more sophisticated optimization methods. Here, we give a short introduction and dis...
Residual norm steepest descent based iterative algorithms for Sylvester tensor equations
Consider the following consistent Sylvester tensor equation \[\mathscr{X}\times_1 A + \mathscr{X}\times_2 B + \mathscr{X}\times_3 C = \mathscr{D},\] where the matrices $A, B, C$ and the tensor $\mathscr{D}$ are given and $\mathscr{X}$ is the unknown tensor. The current paper is concerned with examining a simple and neat framework for accelerating the speed of convergence of the gradient-based iterative algorithm and ...
A new Levenberg-Marquardt approach based on Conjugate gradient structure for solving absolute value equations
In this paper, we present a new approach for solving the absolute value equation (AVE) which uses the Levenberg-Marquardt method with a conjugate subgradient structure. In conjugate subgradient methods the new direction is obtained by combining the steepest descent direction and the previous direction, which may not lead to good numerical results. Therefore, we replace the steepest descent dir...
Journal title:
Volume, Issue:
Pages: -
Publication date: 2003